The Error Entropy Minimization Algorithm for Neural Network Classification
Authors
Abstract
One way of using entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, the output of the learning system and the target. This framework has been used for regression. In this paper we show how to use the minimization of the entropy of the error for classification. Minimizing the entropy of the error implies a constant value for the errors; in general, this does not imply that the errors are zero. In regression, this problem is solved by shifting the final result so that its average equals the average of the desired target. We prove that, under mild conditions, this algorithm, when used in a classification problem, makes the error converge to zero, and can thus be used for classification.
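The entropy-of-error idea can be made concrete with a small sketch. A common choice in this line of work is Renyi's quadratic entropy estimated with a Gaussian Parzen window, under which minimizing the entropy of the error is equivalent to maximizing the so-called information potential. The function name and the kernel width below are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def information_potential(errors, sigma=0.5):
    """Parzen-window estimate of the information potential V(e).

    Maximizing V is equivalent to minimizing Renyi's quadratic
    entropy H2(e) = -log V(e) of the error sample.
    `sigma` is the Parzen-window smoothing parameter (an assumed value).
    """
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]
    # Interaction of two Gaussian kernels of width sigma gives a
    # Gaussian of variance 2*sigma^2 evaluated at each pairwise difference.
    kernel = np.exp(-diffs**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)
    return kernel.mean()

# Entropy is minimized (V maximized) when all errors take the same
# constant value -- which, as the abstract notes, need not be zero:
v_equal = information_potential([0.3, 0.3, 0.3])
v_spread = information_potential([-0.5, 0.3, 0.9])
assert v_equal > v_spread
```

This illustrates the key point of the abstract: the criterion only concentrates the error distribution; for regression an extra mean shift is needed, while for classification the paper proves the constant converges to zero.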
Similar references
A New Method for Intrusion Detection Using Genetic Algorithm and Neural Network
The article attempts to use neural network and genetic algorithm techniques to present a classification model for a dataset. The goal is to design a model that can act as a firewall in a network; combining optimized algorithms improves reliability and accuracy and reduces the error rate. For this reason, the article uses a feedback neural network and, compared to previous methods, increases a...
Neural network classification using Shannon's entropy
Recent years have witnessed increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of entropic cost functions. We propose a new type of neural network classifier with a multilayer perceptron (MLP) architecture, in which the usual mean square error minimization principle is substituted by the minimizatio...
Neural Network Classification Using Error Entropy Minimization
One way of using entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, the output of the learning system and the target. This framework has been used for regression. In this paper we show how to use the minimization of the entropy of the error for classification. Minimizing the entropy of the error implies...
Optimization of the Error Entropy Minimization Algorithm for Neural Network Classification
One way of using entropy criteria in learning systems is to minimize the entropy of the error between the output of the learning system and the desired targets. In our previous work, we introduced the Error Entropy Minimization (EEM) algorithm for neural network classification. There are some sensitive aspects in the optimization of the EEM algorithm: the size of the Parzen window (smoothing paramet...
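The sensitivity to the Parzen-window size mentioned above is easy to demonstrate. The sketch below, with an assumed helper name and illustrative error values, estimates the error density at a point for several smoothing parameters; the EEM gradient, which is built from the same kernel evaluations, inherits this sensitivity:

```python
import numpy as np

def parzen_density(x, samples, sigma):
    """Gaussian Parzen-window estimate of the error density at point x.

    `sigma` is the smoothing parameter whose choice the abstract
    identifies as a sensitive aspect of the EEM algorithm.
    """
    samples = np.asarray(samples, dtype=float)
    k = np.exp(-(x - samples)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return k.mean()

errors = np.array([-0.1, 0.0, 0.05, 0.4])  # illustrative error sample
# A small sigma yields a spiky, high-variance estimate around the
# samples; a large sigma over-smooths the whole distribution.
spiky = parzen_density(0.0, errors, 0.01)
smooth = parzen_density(0.0, errors, 1.0)
assert spiky > smooth
```

Choosing sigma therefore trades off estimation variance against bias, which is exactly the kind of tuning question the cited paper studies.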